Influence of firing mechanisms on gain modulation
We studied the impact of a dynamical threshold on the f-I curve (the
relationship between the input and the firing rate of a neuron) in the presence
of background synaptic inputs. First, we found that, while the leaky
integrate-and-fire model cannot reproduce the f-I curve of a cortical neuron,
its extension with a dynamical threshold reproduces it very well. Second, we
found that the dynamical threshold modulates the onset
and the asymptotic behavior of the f-I curve. These results suggest that a
cortical neuron has an adaptation mechanism and that the dynamical threshold
has some significance for the computational properties of a neuron.
Comment: 7 pages, 4 figures, conference proceeding
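The mechanism studied in this abstract can be sketched with a toy simulation: a leaky integrate-and-fire neuron whose threshold jumps at each spike and relaxes back to baseline. The function name and all parameter values below are illustrative choices, not taken from the paper:

```python
def lif_adaptive_threshold(I, T=2.0, dt=1e-4, tau_m=0.02, v_reset=0.0,
                           theta0=1.0, d_theta=0.5, tau_theta=0.1):
    """Leaky integrate-and-fire neuron with a dynamical threshold: theta
    jumps by d_theta at each spike and relaxes back to theta0, so sustained
    input elicits fewer spikes than a fixed-threshold model would produce.
    Returns the firing rate in Hz over a T-second simulation."""
    v, theta, n_spikes = 0.0, theta0, 0
    for _ in range(int(T / dt)):
        v += dt * (I - v / tau_m)                    # leaky integration of input I
        theta += dt * (theta0 - theta) / tau_theta   # threshold decays to baseline
        if v >= theta:                               # spike: reset v, raise threshold
            v = v_reset
            theta += d_theta
            n_spikes += 1
    return n_spikes / T

# A crude f-I curve: the adaptive threshold flattens the growth of the rate
# with input, while the rate still increases monotonically.
rates = [lif_adaptive_threshold(I) for I in (60.0, 120.0, 240.0)]
```

With a fixed threshold the rate would track the input much more steeply; the spike-triggered threshold increment is what produces the adaptation the abstract describes.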
Quaternion algebras with the same subfields
G. Prasad and A. Rapinchuk asked if two quaternion division F-algebras that
have the same subfields are necessarily isomorphic. The answer is known to be
"no" for some very large fields. We prove that the answer is "yes" if F is an
extension of a global field K such that F/K is unirational and has zero
unramified Brauer group. We also prove a similar result for Pfister forms and
give an application to tractable fields.
Adaptation Reduces Variability of the Neuronal Population Code
Sequences of events in noise-driven excitable systems with slow variables
often show serial correlations among their intervals of events. Here, we employ
a master equation for general non-renewal processes to calculate the interval
and count statistics of superimposed processes governed by a slow adaptation
variable. For an ensemble of spike-frequency adapting neurons, this results in
regularization of the population activity and enhanced postsynaptic signal
decoding. We confirm our theoretical results in a population of cortical
neurons.
Comment: 4 pages, 2 figures
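The regularization effect can be illustrated with a toy interval model (my own construction, not the paper's master-equation framework): spike-frequency adaptation induces negative serial correlations between interspike intervals, which lowers the long-window Fano factor of the spike count relative to the same intervals shuffled into a renewal process.

```python
import numpy as np

rng = np.random.default_rng(0)

def adapting_intervals(n, base=0.2, eps=0.5, tau=1.0, sigma=0.2):
    """Toy spike-frequency adaptation: an adaptation variable a jumps by eps
    at each spike and decays between spikes; each interspike interval is
    base + a plus Gaussian noise (all parameters are illustrative)."""
    a, isis = 0.0, np.empty(n)
    for i in range(n):
        isis[i] = max(base + a + sigma * rng.standard_normal(), 1e-3)
        a = a * np.exp(-isis[i] / tau) + eps  # decay over the interval, jump at the spike
    return isis

def fano(isis, window):
    """Fano factor (variance/mean) of spike counts in windows of given length."""
    t = np.cumsum(isis)
    counts, _ = np.histogram(t, bins=np.arange(0.0, t[-1], window))
    return counts.var() / counts.mean()

isis = adapting_intervals(100_000)
shuffled = rng.permutation(isis)   # destroys serial correlations, keeps interval statistics
ff_adapt, ff_renewal = fano(isis, 50.0), fano(shuffled, 50.0)
# negative interval correlations reduce the long-window count variability:
# ff_adapt comes out below ff_renewal
```

Shuffling is the key control: it preserves the interval distribution exactly, so any drop in the Fano factor is attributable to the serial correlations alone.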
Onset of negative interspike interval correlations in adapting neurons
Negative serial correlations in single spike trains are an effective mechanism
for reducing the variability of spike counts. One of the factors contributing to
the development of negative correlations between successive interspike
intervals is the presence of adaptation currents. In this work, based on a
hidden Markov model and a proper statistical description of conditional
responses, we obtain these correlations analytically in a dynamical neuron
model that incorporates adaptation. We derive the serial correlation
coefficients for arbitrary lags under weak adaptation. In this
case, the behavior of correlations is universal and depends on the first-order
statistical description of an exponentially driven time-inhomogeneous
stochastic process.
Comment: 12 pages (10 pages in the journal version), 6 figures, published in Phys. Rev. E; http://link.aps.org/doi/10.1103/PhysRevE.84.04190
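The serial correlation coefficients this abstract derives can be estimated numerically from a toy adaptation map (an illustrative stand-in, not the paper's hidden Markov model): an adaptation variable that jumps at each spike and decays in between produces a negative lag-1 correlation that fades at larger lags.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy adapting interval sequence: the adaptation variable a lengthens the
# next interspike interval; a long interval lets a decay more, shortening
# the one after it, hence the negative serial correlation.
n, base, eps, tau, sigma = 200_000, 0.2, 0.5, 1.0, 0.2
a, isis = 0.0, np.empty(n)
for i in range(n):
    isis[i] = max(base + a + sigma * rng.standard_normal(), 1e-3)
    a = a * np.exp(-isis[i] / tau) + eps

def serial_correlation(isis, lag):
    """Serial correlation coefficient rho_lag of the interval sequence."""
    x = isis[:-lag] - isis.mean()
    y = isis[lag:] - isis.mean()
    return np.mean(x * y) / isis.var()

rhos = [serial_correlation(isis, k) for k in (1, 2, 3)]
# rhos[0] is clearly negative; the magnitude shrinks with increasing lag
```

The parameter choices here put the model in a weak-noise regime where the lag-1 coefficient is strongly negative; the analytical treatment in the paper covers arbitrary lags.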
Balanced Synaptic Input Shapes the Correlation between Neural Spike Trains
Stimulus properties, attention, and behavioral context influence correlations between the spike times produced by a pair of neurons. However, the biophysical mechanisms that modulate these correlations are poorly understood. With a combined theoretical and experimental approach, we show that the rate of balanced excitatory and inhibitory synaptic input modulates the magnitude and timescale of pairwise spike train correlation. High-rate synaptic inputs promote spike time synchrony rather than long timescale spike rate correlations, while low-rate synaptic inputs produce the opposite effect. This correlation shaping is due to a combination of enhanced high frequency input transfer and reduced firing rate gain in the high input rate state compared to the low state. Our study extends neural modulation from single neuron responses to population activity, a necessary step in understanding how the dynamics and processing of neural activity change across distinct brain states.
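A minimal way to measure such pairwise correlations numerically (my own sketch, with illustrative parameters rather than the paper's model): drive two leaky integrate-and-fire neurons with partially shared fluctuating input and compute the Pearson correlation of their spike counts in a fixed window.

```python
import numpy as np

rng = np.random.default_rng(2)

def shared_input_count_corr(c=0.5, T=200.0, dt=1e-3, tau=0.02,
                            mu=0.8, sigma=0.4, theta=1.0, win=0.1):
    """Two LIF neurons receiving a fraction c of common input noise.
    Returns the Pearson correlation of their spike counts in windows
    of length win (all parameter values are illustrative)."""
    n_steps = int(T / dt)
    # fraction c of the input fluctuations is common to both neurons
    noise = (np.sqrt(c) * rng.standard_normal((n_steps, 1))
             + np.sqrt(1 - c) * rng.standard_normal((n_steps, 2)))
    v = np.zeros(2)
    spikes = np.zeros((n_steps, 2), dtype=bool)
    for i in range(n_steps):
        v += dt * (mu - v) / tau + sigma * np.sqrt(2 * dt / tau) * noise[i]
        fired = v >= theta
        v[fired] = 0.0            # reset after a spike
        spikes[i] = fired
    per_win = int(win / dt)       # bin spikes into counting windows
    counts = spikes[: (n_steps // per_win) * per_win].reshape(-1, per_win, 2).sum(axis=1)
    return np.corrcoef(counts[:, 0], counts[:, 1])[0, 1]

rho = shared_input_count_corr()
# shared input produces a clearly positive spike-count correlation
```

Sweeping the input statistics (here fixed at `mu`, `sigma`) and the window length `win` is how one would probe the rate-dependent magnitude and timescale effects the abstract reports.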
Collaborative Brain-Computer Interface for Aiding Decision-Making
We look at the possibility of integrating the percepts from multiple non-communicating observers as a means of achieving better joint perception and better group decisions. Our approach combines a brain-computer interface with human behavioural responses. To test these ideas under controlled conditions, we asked observers to perform a simple matching task involving the rapid sequential presentation of pairs of visual patterns and the subsequent decision as to whether the two patterns in a pair were the same or different. We recorded the response times of observers as well as a neural feature that predicts incorrect decisions and, thus, indirectly indicates the confidence of the decisions made by the observers. We then built a composite neuro-behavioural feature which optimally combines the two measures. For group decisions, we used a majority rule and three rules that weight the decisions of each observer based on response times and our neural and neuro-behavioural features. Results indicate that the integration of behavioural responses and neural features can significantly improve accuracy when compared with the majority rule. An analysis of event-related potentials indicates that substantial differences are present in the proximity of the response for correct and incorrect trials, further corroborating the idea of using hybrids of brain-computer interfaces and traditional strategies for improving decision making.
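The contrast between a plain majority rule and confidence-weighted rules can be shown in miniature; the function `weighted_group_decision` and its inputs are hypothetical stand-ins for the paper's response-time and neural confidence weights:

```python
import numpy as np

def weighted_group_decision(decisions, confidences):
    """Combine individual binary decisions (+1/-1) weighted by per-trial
    confidence estimates (e.g. derived from response times or a neural
    feature). Plain majority voting is the special case of equal weights;
    a tie yields 0."""
    return int(np.sign(np.dot(decisions, confidences)))

# majority rule: equal weights, so the two "same" votes win
assert weighted_group_decision([1, 1, -1], [1.0, 1.0, 1.0]) == 1
# confidence weighting: one highly confident dissenter overturns the majority
assert weighted_group_decision([1, 1, -1], [0.2, 0.3, 0.9]) == -1
```

This captures the core idea that a minority vote backed by high confidence can outweigh an unconfident majority, which is where the reported accuracy gains over the majority rule come from.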
Impact of network structure and cellular response on spike time correlations
Novel experimental techniques reveal the simultaneous activity of larger and
larger numbers of neurons. As a result there is increasing interest in the
structure of cooperative -- or correlated -- activity in neural populations,
and in the possible impact of such correlations on the neural code. A
fundamental theoretical challenge is to understand how the architecture of
network connectivity along with the dynamical properties of single cells shape
the magnitude and timescale of correlations. We provide a general approach to
this problem by extending prior techniques based on linear response theory. We
consider networks of general integrate-and-fire cells with arbitrary
architecture, and provide explicit expressions for the approximate
cross-correlation between constituent cells. These correlations depend strongly
on the operating point (input mean and variance) of the neurons, even when
connectivity is fixed. Moreover, the approximations admit an expansion in
powers of the matrices that describe the network architecture. This expansion
can be readily interpreted in terms of paths between different cells. We apply
our results to large excitatory-inhibitory networks, and demonstrate first how
precise balance -- or lack thereof -- between the strengths and timescales of
excitatory and inhibitory synapses is reflected in the overall correlation
structure of the network. We then derive explicit expressions for the average
correlation structure in randomly connected networks. These expressions help to
identify the important factors that shape coordinated neural activity in such
networks.
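The path expansion mentioned above has a simple linear-algebra core, sketched here for a static linear-response surrogate (the matrix `W` and the identity noise covariance are stand-ins, not the paper's frequency-dependent expressions): correlations generated by filtering uncorrelated intrinsic noise through the network equal a geometric series over paths of increasing length.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 5
W = rng.standard_normal((n, n))
W *= 0.5 / np.max(np.abs(np.linalg.eigvals(W)))   # set spectral radius to 0.5

# Static surrogate: activity x = W x + xi with uncorrelated intrinsic noise
# xi, so Cov(x) = (I - W)^{-1} (I - W)^{-T}.
I_n = np.eye(n)
P = np.linalg.inv(I_n - W)                        # full network propagator
C_full = P @ P.T

# Path expansion: (I - W)^{-1} = sum_k W^k, where the k-th term collects
# contributions from paths of length k between cells; truncate at K hops.
K = 60
P_paths = sum(np.linalg.matrix_power(W, k) for k in range(K))
C_paths = P_paths @ P_paths.T
# with spectral radius 0.5, the truncated path sum reproduces C_full
```

The expansion converges because the spectral radius of `W` is below one, and truncating it at small `K` is exactly the "paths between different cells" interpretation: correlation built up from direct connections, then two-hop paths, and so on.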